- abstraction (a layer between hardware/software, hidden from the user)
- innovation: "A new or improved idea, device, product, etc."
- prototype: "A proof of concept"
- bandwidth: Transmission capacity, measured by bit rate
- latency: Time it takes for a bit to travel from its sender to its receiver.
- protocol: A set of rules governing the exchange or transmission of data between devices
- router: The "Traffic Cop" of the network; it directs packets toward their destination
- packets: Discrete blocks of internet traffic sent between computers & servers as directed by routers.
- Port - one of 65,536 'doors' available to access your computer from the outside world
- Server - A computer designed to process specific data requests from users
- TCP - Transmission Control Protocol - Provides connection information to a specific port on a specific server on the interweb
- IP - Internet Protocol - Provides the name/address information used to reach a specific server on the interweb
- HTTP: Hypertext Transfer Protocol -- the protocol browsers and web servers use to exchange pages
- Root Servers (Manage the DNS system)
- DNS - Domain Name System -- The service that translates domain names (the hostname part of a URL) to IP addresses (see the lookup sketch just after this list)
- Redundancy (Backups and Many Paths)
- Fault Tolerance - "a system's ability to continue even when one or more of its components fail"
- Digital Divide - refers to the gap between individuals or communities who have access to and can effectively use digital technologies, such as computers and the internet, and those who do not
- Crowd Sourcing - "the practice of obtaining input or information from a large number of people via the Internet."
- Citizen Science - Providing the public the opportunity to take part in data collection and/or analysis
- Distributed Computing (divide and conquer)
- Sequential Computing (one at a time, please)
- Parallel Computing (split and run!)
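To tie a few of these terms together (DNS, IP, server), here is a minimal sketch, in Python, of asking the DNS system to translate a domain name into an IP address. The hostname used is just a placeholder, not anything specific to this class -- swap in any site you like.

```python
# A minimal sketch of a DNS lookup using Python's standard library.
# "example.com" is just a placeholder hostname.
import socket

hostname = "example.com"
ip_address = socket.gethostbyname(hostname)  # asks a DNS resolver to translate the name
print(f"{hostname} resolves to {ip_address}")
```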
WORK O' THE DAY:
Please take a gander at THIS
Notice that it isn't the CPU that divides up the tasks; it's the software (the OS and/or the application's designer) that splits the work and assigns it to different CPUs in the SAME computer.
Obviously that only works if your computer has more than one CPU (or core), which is fairly common these days.
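If you want to see that "the software divides up the tasks" idea in action, here is a rough sketch (in Python, assuming a multi-core machine) that runs the same chunk of work sequentially and then in parallel across however many CPUs the machine reports. The work function and the numbers are made up purely for illustration.

```python
# A rough sketch of sequential vs. parallel computing on one multi-core machine.
# The "work" (summing squares) is a made-up stand-in for any CPU-heavy task.
import time
from multiprocessing import Pool, cpu_count

def sum_of_squares(n):
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    tasks = [2_000_000] * 8  # eight chunks of work

    # Sequential: one at a time, please
    start = time.time()
    sequential_results = [sum_of_squares(n) for n in tasks]
    print(f"Sequential: {time.time() - start:.2f} s")

    # Parallel: the Pool (software, not the CPU) splits the tasks across the cores
    start = time.time()
    with Pool(cpu_count()) as pool:
        parallel_results = pool.map(sum_of_squares, tasks)
    print(f"Parallel on {cpu_count()} CPUs: {time.time() - start:.2f} s")
```

On a typical multi-core laptop the parallel run should finish noticeably faster, though not perfectly N-times faster, since splitting the work and starting the worker processes has its own cost.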
The kinda sad fact is that we don't really see these massive "distributed" computing projects much these days.
As we will learn when we dive into security, most folks are not really okay with leaving their computer turned on and unattended while connected to the internet. (Why is that, by the way? Be specific in terms of what we've learned about how information flies around the interweb.)
Additionally, folks are generally not too keen on leaving their computer turned on and running someone else's application across the internet, for very similar, if not the same, reasons.
Why is that too bad? (Or in fact, do you agree that it is?)
Do a search for "Citizen Science" and "Distributed Computing" and you'll get a flavor of what I mean.... again, too bad (IMHO)
═══════════════════════════
That doesn't mean that distributed computing is a dead model, however. How else might distributed computing be applied to a problem?
The producers of the movie Titanic were presented with a rather nasty problem. The script called for several instances where present-day footage of the wreck of the Titanic transforms, in an amazing animation sequence, into the glorious ship as it looked in the days and moments before it sank.
That required a massive, massive amount of processing power that was NOT available to them -- unless they wanted to buy time on a Cray supercomputer (not cost effective, and they most definitely didn't want the world to know some of the specifics of what they were trying to achieve).
What did they do? A technical analysis is HERE